Normal Linear Models with Lattice Conditional Independence Restrictions
Authors
Michael D. Perlman

Abstract
It is shown that each multivariate normal model determined by lattice conditional independence (LCI) restrictions on the covariances may be extended in a natural way to a normal linear model with corresponding lattice restrictions on the means. For these extended models it remains true that the likelihood function (LF) and parameter space (PS) can be factored into products of conditional LF's and PS's, respectively, each factor being the LF or PS of an ordinary multivariate normal linear regression model, from which maximum likelihood estimators and likelihood ratio test statistics are readily obtained. This extends the classical MANOVA and GMANOVA models, where the linear restrictions on the means are less general but where no restrictions are imposed on the covariances. It is also shown how a collection of nonnested dependent normal linear regression models may be combined into a single linear model by imposing a parsimonious set of LCI restrictions.
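As a hedged illustration of the factorization described above (the specific lattice below is an illustrative assumption, not one taken from the paper), consider three jointly normal observed vectors indexed by 1, 2, 3 and the lattice of index sets 𝒦 = {∅, {1}, {1,2}, {1,3}, {1,2,3}}. The single LCI restriction this lattice encodes, and the corresponding factorization of the joint density, are
\[
x_2 \perp\!\!\!\perp x_3 \mid x_1,
\qquad
f(x_1, x_2, x_3) = f(x_1)\, f(x_2 \mid x_1)\, f(x_3 \mid x_1),
\]
each conditional factor having the form of an ordinary multivariate normal linear regression, e.g.
\[
x_2 \mid x_1 \sim N\!\left(\alpha_2 + B_{2\mid 1}\, x_1,\ \Sigma_{2\mid 1}\right),
\]
with its own freely varying regression coefficients and residual covariance. Under such a factorization, maximum likelihood estimation and likelihood ratio testing reduce to fitting each regression factor separately; in the extended models of the abstract, the lattice restrictions on the means act on the mean term of each factor without disturbing this factor-by-factor structure.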
Similar resources
Marginal conditional independence models with application to graphical modeling
Conditional independence models are defined by a set of conditional independence restrictions and play an important role in many statistical applications, especially, but not only, graphical modeling. In this paper we identify a subclass of these models which are hierarchical marginal log-linear, as defined by Bergsma and Rudas (2002a). Such models are smooth, which implies the applicability of...
Normal Linear Regression Models with Recursive Graphical Markov Structure*
A multivariate normal statistical model defined by the Markov properties determined by an acyclic digraph admits a recursive factorization of its likelihood function (LF) into the product of conditional LFs, each factor having the form of a classical multivariate linear regression model (≡ MANOVA model). Here these models are extended in a natural way to normal linear regression models whose LF...
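A hedged sketch of the recursive factorization described in this snippet (the three-vertex digraph below is an illustrative assumption, not taken from the paper): for an acyclic digraph with vertex set V and parent sets \mathrm{pa}(v), the joint normal density factors as
\[
f(x) = \prod_{v \in V} f\!\left(x_v \mid x_{\mathrm{pa}(v)}\right),
\qquad
x_v \mid x_{\mathrm{pa}(v)} \sim N\!\left(\alpha_v + B_v\, x_{\mathrm{pa}(v)},\ \Sigma_v\right),
\]
so that, for example, the digraph 1 \to 3 \leftarrow 2 gives f(x_1, x_2, x_3) = f(x_1)\, f(x_2)\, f(x_3 \mid x_1, x_2), each factor an ordinary normal linear regression (≡ MANOVA-type) model.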
High Dimensional Graphs and Variable Selection with the Lasso
The pattern of zero entries in the inverse covariance matrix of a multivariate normal distribution corresponds to conditional independence restrictions between variables. Covariance selection aims at estimating those structural zeros from data. We show that neighborhood selection with the Lasso is a computationally attractive alternative to standard covariance selection for sparse high-dimensio...
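As a hedged illustration of neighborhood selection with the Lasso, the sketch below regresses each variable on all the others with an ℓ1 penalty and reads its neighborhood off the nonzero coefficients; it assumes scikit-learn is available, and the penalty level alpha, the combination rule, and the simulated data are arbitrary illustrative choices rather than the paper's tuning.

# Minimal sketch of neighborhood selection with the Lasso (illustrative only).
# Assumes scikit-learn; alpha is an arbitrary choice, not a recommended tuning.
import numpy as np
from sklearn.linear_model import Lasso

def lasso_neighborhoods(X, alpha=0.1):
    """For each variable j, Lasso-regress X[:, j] on the remaining variables
    and return the variables with nonzero coefficients (its neighborhood)."""
    n, p = X.shape
    neighborhoods = {}
    for j in range(p):
        others = [k for k in range(p) if k != j]
        fit = Lasso(alpha=alpha).fit(X[:, others], X[:, j])
        neighborhoods[j] = {others[i] for i, c in enumerate(fit.coef_) if c != 0.0}
    return neighborhoods

def estimated_edges(neighborhoods, rule="or"):
    """Combine per-node neighborhoods into an undirected edge set,
    using either the OR rule or the more conservative AND rule."""
    p = len(neighborhoods)
    edges = set()
    for j in range(p):
        for k in range(j + 1, p):
            in_j, in_k = k in neighborhoods[j], j in neighborhoods[k]
            if (rule == "or" and (in_j or in_k)) or (rule == "and" and in_j and in_k):
                edges.add((j, k))
    return edges

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.standard_normal((200, 5))   # simulated data, for illustration only
    X[:, 1] += 0.8 * X[:, 0]            # induce a few dependencies
    X[:, 2] += 0.8 * X[:, 1]
    print(estimated_edges(lasso_neighborhoods(X, alpha=0.05)))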
High-dimensional Graphs and Variable Selection with the Lasso by Nicolai Meinshausen
The pattern of zero entries in the inverse covariance matrix of a multivariate normal distribution corresponds to conditional independence restrictions between variables. Covariance selection aims at estimating those structural zeros from data. We show that neighborhood selection with the Lasso is a computationally attractive alternative to standard covariance selection for sparse high-dimensio...